Neural Nets based Predictive Prefetching to Tolerate WWW Latency

Authors

  • Tamer I. Ibrahim
  • Cheng-Zhong Xu
Abstract

With the explosive growth of WWW applications on the Internet, users are experiencing access delays more often than ever. Recent studies have shown that pre-fetching can alleviate WWW latency to a greater extent than caching. Existing pre-fetching methods are mostly based on URL graphs: they use the graphical structure of hypertext links to determine the possible paths through a hypertext system. While they have proved effective for pre-fetching frequently accessed documents, they are incapable of pre-retrieving documents whose URLs have never been accessed before. In this paper, we propose a context-specific pre-fetching technique to overcome this limitation. It relies on keywords in the anchor texts of URLs to characterize user access patterns and on neural networks over the keyword set to predict future requests. It features a self-learning capability and adapts well to changes in the user's surfing interests. The technique was implemented in a SmartNewsReader system and evaluated in daily browsing of the MSNBC and CNN news sites. The experimental results showed that pre-fetching achieved a hit ratio of approximately 60%, and that fewer than 30% of the pre-fetched documents were undesired.
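The abstract only outlines the approach, so the following is a minimal, hypothetical sketch (in Python) of keyword-based predictive pre-fetching: anchor-text keywords are turned into a binary feature vector, a small feed-forward network scores each hyperlink, links scoring above a threshold are pre-fetched, and the network is updated online from the user's actual clicks. The keyword vocabulary, the PrefetchPredictor class, the example URLs, and the 0.5 threshold are all illustrative assumptions, not the authors' SmartNewsReader implementation.

# Hypothetical sketch of keyword-based predictive pre-fetching
# (illustrative only; not the paper's SmartNewsReader code).
import numpy as np

# Assumed keyword vocabulary extracted from anchor texts of a news site.
KEYWORDS = ["sports", "politics", "weather", "business", "health",
            "technology", "world", "local", "opinion", "entertainment"]

def anchor_features(anchor_text):
    """Binary vector: which vocabulary keywords occur in the anchor text."""
    words = anchor_text.lower().split()
    return np.array([1.0 if k in words else 0.0 for k in KEYWORDS])

class PrefetchPredictor:
    """One-hidden-layer network trained online with plain gradient descent."""
    def __init__(self, n_inputs, n_hidden=8, lr=0.1):
        rng = np.random.default_rng(0)
        self.W1 = rng.normal(scale=0.1, size=(n_hidden, n_inputs))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(scale=0.1, size=n_hidden)
        self.b2 = 0.0
        self.lr = lr

    def predict(self, x):
        """Probability that the link described by x will be requested next."""
        h = np.tanh(self.W1 @ x + self.b1)
        return 1.0 / (1.0 + np.exp(-(self.W2 @ h + self.b2)))

    def update(self, x, clicked):
        """One online learning step from the observed click (1) or skip (0)."""
        h = np.tanh(self.W1 @ x + self.b1)
        p = 1.0 / (1.0 + np.exp(-(self.W2 @ h + self.b2)))
        err = p - (1.0 if clicked else 0.0)      # d(cross-entropy)/d(logit)
        grad_h = err * self.W2 * (1.0 - h ** 2)  # back-propagate through tanh
        self.W1 -= self.lr * np.outer(grad_h, x)
        self.b1 -= self.lr * grad_h
        self.W2 -= self.lr * err * h
        self.b2 -= self.lr * err

# Usage: score the anchors on the current page and pre-fetch the likely ones.
model = PrefetchPredictor(n_inputs=len(KEYWORDS))
links = {"/sports/game.html": "College sports scores",      # hypothetical URLs
         "/weather/today.html": "Local weather forecast"}
for url, anchor in links.items():
    if model.predict(anchor_features(anchor)) > 0.5:         # tunable threshold
        print("prefetch", url)   # fetch into the browser/proxy cache
# After the user actually clicks a link, reinforce the model:
model.update(anchor_features("College sports scores"), clicked=True)

The online update is what would give the self-learning behaviour mentioned in the abstract: links whose anchor texts share keywords with previously clicked anchors gradually receive higher scores as the user's interests shift, without any dependence on previously visited URLs.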


Similar articles

Latency Reduction and Tolerance in Distributed Digital Libraries

Caching and prefetching are well known to reduce and tolerate latency in distributed systems such as the World Wide Web (WWW). In the scientific world those techniques could be used for digital libraries, where latency is experienced in the retrieval of publications from publishers. This thesis is an investigation into the prospects of applying caching and prefetching in the context of distribu...


CS 533 Computer Networks Term Project: Prefetching on WWW – its past and future

This paper summarizes the various research efforts on prefetching schemes in the WWW. We try to classify and rate the previous methods and results on this topic and point out potential future work. We proceed from several perspectives: questions addressed in the past, methods that have been used, advantages and limitations, comparison of different results, open questions, new changes in Internet...


Web Access Latency Reduction Using CRF-Based Predictive Caching

Reducing the Web access latency perceived by a Web user has become a problem of interest. Web prefetching and caching are two effective techniques that can be used together to reduce the access latency problem on the Internet. Because the success of Web prefetching mainly relies on the prediction accuracy of prediction methods, in this paper we employ a powerful sequential learning model, Condi...


Proxy-based prefetching and pushing of web resources

The use of the WWW is more prevalent now than ever. Latency has a significant impact on the WWW, with higher latencies causing longer loading times for webpages; conversely, lowering the latency lowers a page's loading time. Latencies are often caused by data traveling long distances or through gateways that add additional processing delays to the forwarded packets. In t...


An Efficient Approach For Optimal Prefetching To Reduce Web Access Latency

The exponential growth and popularity of the WWW increase the amount of traffic, which results in major congestion problems over the available bandwidth for the retrieval of data. This in turn increases user-perceived latency. Prefetching of web pages is a promising approach that can significantly reduce web access latency. It refers to the mechanism of deducing the forthcoming page access...



Journal title:

Volume   Issue

Pages  -

Publication date: 2000